
    A new metric between distributions of point processes

    Most metrics between finite point measures currently used in the literature have the flaw that they do not treat differing total masses adequately for applications. This paper introduces a new metric $\bar{d}_1$ that combines positional differences of points under a closest match with the relative difference in total mass in a way that fixes this flaw. A comprehensive collection of theoretical results about $\bar{d}_1$ and its induced Wasserstein metric $\bar{d}_2$ for point process distributions is given, including examples of useful $\bar{d}_1$-Lipschitz continuous functions, $\bar{d}_2$ upper bounds for Poisson process approximation, and $\bar{d}_2$ upper and lower bounds between distributions of point processes of i.i.d. points. Furthermore, we present a statistical test for multiple point pattern data that demonstrates the potential of $\bar{d}_1$ in applications. Comment: 20 pages, 2 figures.
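
    A minimal sketch of a distance of this "closest match plus mass difference" flavour, under assumed ingredients (a Euclidean ground distance truncated at a cutoff, division by the larger cardinality); it is meant to illustrate the idea rather than reproduce the paper's exact definition of $\bar{d}_1$.

```python
# Sketch: distance between two finite point patterns that combines an optimal
# ("closest") matching of points with a penalty for the difference in total mass.
# The truncated Euclidean ground distance and the normalisation are assumptions.
import numpy as np
from scipy.optimize import linear_sum_assignment


def match_distance(xs, ys, cutoff=1.0):
    """Distance between point patterns xs (m x d) and ys (n x d)."""
    xs, ys = np.atleast_2d(xs), np.atleast_2d(ys)
    m, n = len(xs), len(ys)
    if max(m, n) == 0:
        return 0.0
    if m > n:                                   # arrange so that m <= n
        xs, ys, m, n = ys, xs, n, m
    # bounded ground distance: Euclidean distance truncated at `cutoff`
    cost = np.minimum(np.linalg.norm(xs[:, None, :] - ys[None, :, :], axis=-1), cutoff)
    row, col = linear_sum_assignment(cost)      # optimal (closest) matching
    matched = cost[row, col].sum()
    unmatched = cutoff * (n - m)                # penalty for the unmatched mass
    return (matched + unmatched) / n


# toy usage: two point patterns of different sizes in the unit square
rng = np.random.default_rng(0)
print(match_distance(rng.random((5, 2)), rng.random((8, 2))))
```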

    Zero biasing and a discrete central limit theorem

    We introduce a new family of distributions to approximate $\mathbb{P}(W\in A)$ for $A\subset\{\dots,-2,-1,0,1,2,\dots\}$ and $W$ a sum of independent integer-valued random variables $\xi_1, \xi_2, \dots, \xi_n$ with finite second moments, where, with large probability, $W$ is not concentrated on a lattice of span greater than 1. The well-known Berry--Esseen theorem states that, for $Z$ a normal random variable with mean $\mathbb{E}(W)$ and variance $\operatorname{Var}(W)$, $\mathbb{P}(Z\in A)$ provides a good approximation to $\mathbb{P}(W\in A)$ for $A$ of the form $(-\infty,x]$. However, for more general $A$, such as the set of all even numbers, the normal approximation becomes unsatisfactory and it is desirable to have an appropriate discrete, nonnormal distribution which approximates $W$ in total variation, and a discrete version of the Berry--Esseen theorem to bound the error. In this paper, using the concept of zero biasing for discrete random variables (cf. Goldstein and Reinert [J. Theoret. Probab. 18 (2005) 237--260]), we introduce a new family of discrete distributions and provide a discrete version of the Berry--Esseen theorem showing how members of the family approximate the distribution of a sum $W$ of integer-valued variables in total variation. Comment: Published at http://dx.doi.org/10.1214/009117906000000250 in the Annals of Probability (http://www.imstat.org/aop/) by the Institute of Mathematical Statistics (http://www.imstat.org).
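
    A numerical sketch of the kind of total variation comparison involved: the exact law of a sum $W$ of i.i.d. integer-valued variables is compared, on the set of even numbers and in total variation, with a translated Poisson distribution, which serves here as a stand-in for the family constructed in the paper (the stand-in and the moment matching are assumptions made only for illustration).

```python
# Sketch: why a discrete approximation is wanted. Compute the exact pmf of
# W = X_1 + ... + X_30 by convolution and compare P(W even) and the total
# variation distance against a translated Poisson with (roughly) matched
# mean and variance -- an illustrative stand-in, not the paper's family.
import numpy as np
from scipy.stats import poisson

values, probs = np.array([0, 1, 2]), np.array([0.5, 0.3, 0.2])
n_terms = 30

step = np.zeros(values.max() + 1)
step[values] = probs
pmf = np.array([1.0])
for _ in range(n_terms):                      # exact pmf of the sum, by convolution
    pmf = np.convolve(pmf, step)
support = np.arange(len(pmf))

mean = n_terms * (values * probs).sum()
var = n_terms * ((values ** 2 * probs).sum() - (values * probs).sum() ** 2)

shift = int(round(mean - var))                # translated Poisson: shift + Poisson(var)
approx = poisson.pmf(support - shift, var)

even = support % 2 == 0
print("P(W even), exact :", pmf[even].sum())
print("P(W even), approx:", approx[even].sum())
print("total variation  :", 0.5 * np.abs(pmf - approx).sum())
```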

    On approximation of Markov binomial distributions

    For a Markov chain $\mathbf{X}=\{X_i, i=1,2,\dots,n\}$ with the state space $\{0,1\}$, the random variable $S:=\sum_{i=1}^n X_i$ is said to follow a Markov binomial distribution. The exact distribution of $S$, denoted $\mathcal{L}S$, is very computationally intensive for large $n$ (see Gabriel [Biometrika 46 (1959) 454--460] and Bhat and Lal [Adv. in Appl. Probab. 20 (1988) 677--680]), and this paper concerns suitable approximate distributions for $\mathcal{L}S$ when $\mathbf{X}$ is stationary. We conclude that the negative binomial and binomial distributions are appropriate approximations for $\mathcal{L}S$ when $\operatorname{Var}S$ is greater than and less than $\mathbb{E}S$, respectively. Also, due to the unique structure of the distribution, we are able to derive explicit error estimates for these approximations. Comment: Published at http://dx.doi.org/10.3150/09-BEJ194 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
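
    A simulation sketch of the moment heuristic stated above: generate $S$ from a stationary $\{0,1\}$-valued Markov chain, then moment-match a binomial or a negative binomial depending on whether $\operatorname{Var}S$ is smaller or larger than $\mathbb{E}S$. The moment-matched fits and the particular transition probabilities are illustrative assumptions, not the paper's explicit error estimates.

```python
# Sketch: simulate S = X_1 + ... + X_n for a stationary two-state Markov chain,
# then fit a binomial (if Var S < E S) or a negative binomial (if Var S > E S)
# by matching the first two moments, and report a rough total variation distance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
p01, p10 = 0.2, 0.6                 # transition probabilities 0 -> 1 and 1 -> 0
pi1 = p01 / (p01 + p10)             # stationary probability of state 1
n, reps = 50, 10_000

samples = np.empty(reps, dtype=int)
for r in range(reps):
    x = rng.random() < pi1          # start the chain in stationarity
    s = 0
    for _ in range(n):
        s += x
        x = rng.random() < (1 - p10 if x else p01)
    samples[r] = s

m, v = samples.mean(), samples.var()
if v < m:                           # binomial: mean Np, variance Np(1-p)
    p_hat = 1 - v / m
    fit, name = stats.binom(int(round(m / p_hat)), p_hat), "binomial"
else:                               # negative binomial: mean r(1-p)/p, variance r(1-p)/p^2
    p_hat = m / v
    fit, name = stats.nbinom(m * p_hat / (1 - p_hat), p_hat), "negative binomial"

k = np.arange(n + 1)
empirical = np.bincount(samples, minlength=n + 1) / reps
print(f"E S ~ {m:.2f}, Var S ~ {v:.2f}; {name} fit, "
      f"TV distance ~ {0.5 * np.abs(empirical - fit.pmf(k)).sum():.3f}")
```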

    Divergence from, and Convergence to, Uniformity of Probability Density Quantiles

    The probability density quantile (pdQ) carries essential information regarding shape and tail behavior of a location-scale family. Convergence of repeated applications of the pdQ mapping to the uniform distribution is investigated and new fixed point theorems are established. The Kullback-Leibler divergences from uniformity of these pdQs are mapped and found to be ingredients in power functions of optimal tests for uniformity against alternative shapes. Comment: 13 pages, 2 figures. arXiv admin note: substantial text overlap with arXiv:1605.0018
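
    A numerical sketch of the divergence-from-uniformity computation, assuming the pdQ is the density-quantile function $f(Q(u))$ normalised to integrate to 1 on $(0,1)$ (an assumption about the definition, made here only for illustration).

```python
# Sketch: evaluate an assumed pdQ, u -> f(Q(u)) / \int_0^1 f(Q(t)) dt, on a fine
# grid and compute its Kullback-Leibler divergence from the uniform density.
import numpy as np
from scipy import stats


def pdq_on_grid(dist, m=10_000):
    """Assumed pdQ of a scipy.stats distribution, on midpoints of a grid of (0, 1)."""
    u = (np.arange(m) + 0.5) / m
    fq = dist.pdf(dist.ppf(u))          # density-quantile function f(Q(u))
    return fq / fq.mean()               # normalise so the grid values average to 1


def kl_from_uniform(g):
    """KL divergence of a density g on (0, 1) from the uniform density."""
    return np.mean(g * np.log(g))


for name, dist in [("normal", stats.norm()), ("exponential", stats.expon()),
                   ("Cauchy", stats.cauchy())]:
    print(f"KL(pdQ || uniform) for {name}: {kl_from_uniform(pdq_on_grid(dist)):.4f}")
```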

    Poisson process approximation: From Palm theory to Stein's method

    This exposition explains the basic ideas of Stein's method for Poisson random variable approximation and Poisson process approximation from the point of view of the immigration-death process and Palm theory. The latter approach also enables us to define local dependence of point processes [Chen and Xia (2004)] and use it to study Poisson process approximation for locally dependent point processes and for dependent superposition of point processes. Comment: Published at http://dx.doi.org/10.1214/074921706000001076 in the IMS Lecture Notes Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org).
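
    A brief simulation sketch of the immigration-death process mentioned above: with immigration rate $\lambda$ and unit per-capita death rate, its stationary distribution is Poisson($\lambda$), which is what makes it a natural vehicle for the generator (Stein's method) approach. The Gillespie-style simulation below is an illustration, not taken from the paper.

```python
# Sketch: immigration-death process with immigration rate lam and unit per-capita
# death rate. Its stationary distribution is Poisson(lam); the time-average
# occupation measure from a long run should therefore be close to Poisson(lam).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
lam, horizon = 4.0, 20_000.0

t, n = 0.0, 0
occupation = {}                                  # total time spent in each state
while t < horizon:
    rate = lam + n                               # births at rate lam, deaths at rate n
    dwell = rng.exponential(1.0 / rate)
    occupation[n] = occupation.get(n, 0.0) + dwell
    t += dwell
    n += 1 if rng.random() < lam / rate else -1  # immigration vs death

states = np.arange(max(occupation) + 1)
empirical = np.array([occupation.get(k, 0.0) for k in states]) / sum(occupation.values())
print("TV distance to Poisson(lam):",
      0.5 * np.abs(empirical - poisson.pmf(states, lam)).sum())
```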

    Stein's method, Palm theory and Poisson process approximation

    The framework of Stein's method for Poisson process approximation is presented from the point of view of Palm theory, which is used to construct Stein identities and define local dependence. A general result (Theorem \ref{importantproposition}) in Poisson process approximation is proved by taking the local approach. It is obtained without reference to any particular metric, thereby allowing wider applicability. A Wasserstein pseudometric is introduced for measuring the accuracy of point process approximation. The pseudometric provides a generalization of many metrics used so far, including the total variation distance for random variables and the Wasserstein metric for processes as in Barbour and Brown [Stochastic Process. Appl. 43 (1992) 9--31]. Also, through the pseudometric, approximation for certain point processes on a given carrier space is carried out by lifting it to one on a larger space, extending an idea of Arratia, Goldstein and Gordon [Statist. Sci. 5 (1990) 403--434]. The error bound in the general result is similar in form to that for Poisson approximation. As it yields the Stein factor $1/\lambda$ as in Poisson approximation, it provides good approximation, particularly in cases where $\lambda$ is large. The general result is applied to a number of problems including Poisson process modeling of rare words in a DNA sequence. Comment: Published by the Institute of Mathematical Statistics (http://www.imstat.org) in the Annals of Probability (http://www.imstat.org/aop/) at http://dx.doi.org/10.1214/00911790400000002
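
    The "Stein factor" remark can be illustrated in the simplest setting of Poisson random-variable approximation: for $W \sim \mathrm{Bin}(n,p)$ and $\lambda = np$, the classical Stein-Chen bound $d_{TV}(\mathcal{L}W, \mathrm{Po}(\lambda)) \le \lambda^{-1}(1-e^{-\lambda})\, np^2$ carries a factor of order $1/\lambda$, so it stays informative as $\lambda$ grows. The sketch below compares the bound with and without that factor; it illustrates the phenomenon only, not the paper's point process result.

```python
# Sketch: the Stein factor 1/lam in Poisson (random variable) approximation.
# For W ~ Bin(n, p) and lam = n p, compare the exact total variation distance to
# Poisson(lam) with the bound n p^2, without and with the factor (1 - e^{-lam})/lam.
import numpy as np
from scipy.stats import binom, poisson


def tv_binom_poisson(n, p):
    """Total variation distance between Bin(n, p) and Poisson(n p)."""
    k = np.arange(n + 1)                 # Poisson mass above n is negligible here
    return 0.5 * np.abs(binom.pmf(k, n, p) - poisson.pmf(k, n * p)).sum()


p = 0.05
for n in [20, 200, 2000]:
    lam = n * p
    raw = n * p ** 2                             # bound without the Stein factor
    improved = (1 - np.exp(-lam)) / lam * raw    # bound with the 1/lam Stein factor
    print(f"lam = {lam:6.1f}: d_TV = {tv_binom_poisson(n, p):.4f}, "
          f"bound without factor = {raw:.3f}, with factor = {improved:.4f}")
```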